Approximating continuous functions by holomorphic and harmonic functions
Authors
Abstract

If Ω is a Widom domain in the plane (e.g., finitely connected) and f is any bounded harmonic function on Ω which is not holomorphic, then we prove that the algebra H∞(Ω)[f] contains all the uniformly continuous functions on Ω. The basic tools are the solution of the ∂̄-equation with L∞ estimates and some estimates on the level sets of functions in BMOA.

Similar resources
Some notes on harmonic and holomorphic functions
These notes are concerned with harmonic and holomorphic functions on Euclidean spaces, where “holomorphic” refers to ordinary complex analysis in dimension 2 and generalizations using quaternions and Clifford algebras in higher dimensions. Among the principal themes are weak solutions, the mean-value property, and subharmonicity.
Approximating continuous functions by iterated function systems and optimization problems
In this paper, some new contractive operators of IFS type on C([a, b]) are constructed. Inverse problems are introduced and studied via convex optimization problems. A stability result and some optimality conditions are given. AMS Subject Classification: 28A80, 41A20
Holomorphic Diffusions and Boundary Behavior of Harmonic Functions
We study a family of differential operators Lα, α ≥ 0, in the unit ball D of Cⁿ with n ≥ 2 that generalize the classical Laplacian (α = 0) and the conformal Laplacian (α = 1/2, that is, the Laplace–Beltrami operator for the Bergman metric on D). Using the diffusion processes associated with these (degenerate) differential operators, the boundary behavior of Lα-harmonic functions is...
Approximating Continuous Functions by ReLU Nets of Minimal Width
This article concerns the expressive power of depth in deep feed-forward neural nets with ReLU activations. Specifically, we answer the following question: for a fixed d ≥ 1, what is the minimal width w such that neural nets with ReLU activations, input dimension d, hidden layer widths at most w, and arbitrary depth can approximate any continuous function of d variables arbitrarily well? It turns...
Journal

Journal title: Transactions of the American Mathematical Society
Year: 1989
ISSN: 0002-9947
DOI: 10.1090/s0002-9947-1989-0961619-2